
    Some aspects of queueing and storage processes : a thesis in partial fulfilment of the requirements for the degree of Master of Science in Statistics at Massey University

    In this study the nature of systems consisting of a single queue is first considered. Attention is then drawn to an analogy between such systems and storage systems. A development of the single queue, namely queues with feedback, is then considered, after first examining feedback processes in general. The behaviour of queues, some with feedback loops, combined into networks is then considered. Finally, the application of such networks to the analysis of interconnected reservoir systems is considered, and the conclusion is drawn that such analytic methods complement the more recently developed mathematical programming methods by providing analytic solutions for subsystem behaviour and thus guiding the development of a system model.
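
    As an illustration of the single-queue-with-feedback systems discussed in this abstract, below is a minimal simulation sketch (not taken from the thesis) of an M/M/1 queue with Bernoulli feedback, in which a served customer rejoins the queue with probability p. All rates are hypothetical, and the analytic benchmark uses the standard result that feedback inflates the effective utilisation to rho = lambda / ((1 - p) * mu).

```python
import random

def simulate_mm1_feedback(lam=1.0, mu=3.0, p=0.4, horizon=200_000, seed=1):
    """Event-driven M/M/1 queue with Bernoulli feedback: a served customer
    rejoins the queue with probability p, otherwise leaves the system.
    Returns the time-averaged number of customers in the system."""
    rng = random.Random(seed)
    t, n = 0.0, 0                          # current time, customers in system
    next_arrival = rng.expovariate(lam)
    next_departure = float("inf")
    area = 0.0                             # integral of n(t) dt

    while t < horizon:
        t_next = min(next_arrival, next_departure)
        area += n * (t_next - t)
        t = t_next
        if t_next == next_arrival:         # external arrival
            n += 1
            next_arrival = t + rng.expovariate(lam)
            if n == 1:                     # server was idle: start service
                next_departure = t + rng.expovariate(mu)
        else:                              # service completion
            if rng.random() < p:           # fed back: rejoins queue, n unchanged
                next_departure = t + rng.expovariate(mu)
            else:                          # departs the system
                n -= 1
                next_departure = t + rng.expovariate(mu) if n > 0 else float("inf")
    return area / t

if __name__ == "__main__":
    lam, mu, p = 1.0, 3.0, 0.4
    rho = lam / ((1 - p) * mu)             # effective utilisation with feedback
    print("simulated mean number in system:", round(simulate_mm1_feedback(lam, mu, p), 3))
    print("analytic rho/(1 - rho)         :", round(rho / (1 - rho), 3))
```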

    Crop Updates 2005 - Farming Systems

    This session covers forty-four papers from different authors:

    PLENARY
    1. 2005 Outlook, David Stephens and Nicola Telcik, Department of Agriculture

    FERTILITY AND NUTRITION
    2. The effect of higher nitrogen fertiliser prices on rotation and fertiliser strategies in cropping systems, Ross Kingwell, Department of Agriculture and University of Western Australia
    3. Stubble management: The short and long term implications for crop nutrition and soil fertility, Wayne Pluske, Nutrient Management Systems and Bill Bowden, Department of Agriculture
    4. Stubble management: The pros and cons of different methods, Bill Bowden, Department of Agriculture, Western Australia and Mike Collins, WANTFA
    5. Effect of stubble burning and seasonality on microbial processes and nutrient recycling, Frances Hoyle, The University of Western Australia
    6. Soil biology and crop production in Western Australian farming systems, D.V. Murphy, N. Milton, M. Osman, F.C. Hoyle, L.K. Abbott, W.R. Cookson and S. Darmawanto, The University of Western Australia
    7. Urea is as effective as CAN when no rain for 10 days, Bill Crabtree, Crabtree Agricultural Consulting
    8. Fertiliser (N,P,S,K) and lime requirements for wheat production in the Merredin district, Geoff Anderson, Department of Agriculture and Darren Kidson, Summit Fertilizers
    9. Trace element applications: Up-front versus foliar? Bill Bowden and Ross Brennan, Department of Agriculture
    10. Fertcare®, Environmental Product Stewardship and Advisor Standards for the Fertiliser Industry, Nick Drew, Fertilizer Industry Federation of Australia (FIFA)

    SOIL AND LAND MANAGEMENT
    11. Species response to row spacing, density and nutrition, Bill Bowden, Craig Scanlan, Lisa Sherriff, Bob French and Reg Lunt, Department of Agriculture
    12. Investigation into the influence of row orientation in lupin crops, Jeff Russell, Department of Agriculture and Angie Roe, Farm Focus Consultants
    13. Deriving variable rate management zones for crops, Ian Maling, Silverfox Solutions and Matthew Adams, DLI
    14. In a world of Precision Agriculture, weigh trailers are not passé, Jeff Russell, Department of Agriculture
    15. Cover crop management to combat ryegrass resistance and improve yields, Jeff Russell, Department of Agriculture and Angie Roe, Farm Focus Consultants
    16. ARGT home page, the place to find information on annual ryegrass toxicity on the web, Dr George Yan, BART Pty Ltd
    17. Shallow leading tine (SLT) ripper significantly reduces draft force, improves soil tilth and allows even distribution of subsoil ameliorants, Mohammad Hamza, Glen Riethmuller and Wal Anderson, Department of Agriculture

    PASTURE AND SUMMER CROP SYSTEMS
    18. New annual pasture legumes for Mediterranean farming systems, Angelo Loi, Phil Nichols, Clinton Revell and David Ferris, Department of Agriculture
    19. How sustainable are phase rotations with lucerne? Phil Ward, CSIRO Plant Industry
    20. Management practicalities of summer cropping, Andrea Hills and Sally-Anne Penny, Department of Agriculture
    21. Rainfall zone determines the effect of summer crops on winter yields, Andrea Hills, Sally-Anne Penny and David Hall, Department of Agriculture
    22. Summer crops and water use, Andrea Hills, Sally-Anne Penny and David Hall, Department of Agriculture, and Michael Robertson and Don Gaydon, CSIRO Brisbane
    23. Risk analysis of sorghum cropping, Andrea Hills and Sally-Anne Penny, Department of Agriculture, and Dr Michael Robertson and Don Gaydon, CSIRO Brisbane

    FARMER DECISION SUPPORT AND ADOPTION
    24. Variety release and End Point Royalties – a new system? Tress Walmsley, Department of Agriculture
    25. Farming system analysis using the STEP Tool, Caroline Peek and Megan Abrahams, Department of Agriculture
    26. The Leakage Calculator: A simple tool for groundwater recharge assessment, Paul Raper, Department of Agriculture
    27. The Cost of Salinity Calculator – your tool for assessing the profitability of salinity management options, Richard O’Donnell and Trevor Lacey, Department of Agriculture
    28. Climate decision support tools, Meredith Fairbanks and David Tennant, Department of Agriculture
    29. Horses for courses – using the best tools to manage climate risk, Cameron Weeks, Mingenew-Irwin Group/Planfarm and Richard Quinlan, Planfarm Agronomy
    30. Use of seasonal outlook for making N decisions in Merredin, Meredith Fairbanks and Alexandra Edward, Department of Agriculture
    31. Forecasts and profits: benefits or bulldust? Chris Carter and Doug Hamilton, Department of Agriculture
    32. A tool to estimate fixed and variable header and tractor depreciation costs, Peter Tozer, Department of Agriculture
    33. Partners in grain: ‘Putting new faces in new places’, Renaye Horne, Department of Agriculture
    34. Results from the Grower Group Alliance, Tracey Gianatti, Grower Group Alliance
    35. Local Farmer Group Network – farming systems research opportunities through local groups, Paul Carmody, Local Farmer Group Network

    GREENHOUSE GAS AND CLIMATE CHANGE
    36. Changing rainfall patterns in the grainbelt, Ian Foster, Department of Agriculture
    37. Vulnerability of broadscale agriculture to the impacts of climate change, Michele John, CSIRO (formerly Department of Agriculture) and Ross George, Department of Agriculture
    38. Impacts of climate change on wheat yield at Merredin, Imma Farré and Ian Foster, Department of Agriculture
    39. Climate change, land use suitability and water security, Ian Kininmonth, Dennis van Gool and Neil Coles, Department of Agriculture
    40. Nitrous oxide emissions from cropping systems, Bill Porter, Department of Agriculture, and Louise Barton, University of Western Australia
    41. The potential of greenhouse sinks to underwrite improved land management in Western Australia, Richard Harper and Peter Ritson, CRC for Greenhouse Accounting and Forest Products Commission, Tony Beck, Tony Beck Consulting Services, Chris Mitchell and Michael Hill, CRC for Greenhouse Accounting
    42. Removing uncertainty from greenhouse emissions, Fiona Barker-Reid, Will Gates, Ken Wilson and Rob Baigent, Department of Primary Industries - Victoria and CRC for Greenhouse Accounting (CRCGA), and Ian Galbally, Mick Meyer and Ian Weeks, CSIRO Atmospheric Research and CRCGA
    43. Greenhouse in Agriculture Program (GIA), Traci Griffin, CRC for Greenhouse Accounting
    44. Grains Greenhouse Accounting framework, D. Rodriguez, M. Probust, M. Meyers, D. Chen, A. Bennett, W. Strong, R. Nussey, I. Galbally and M. Howden

    CONTACT DETAILS FOR PRINCIPAL AUTHOR

    Procalcitonin Is Not a Reliable Biomarker of Bacterial Coinfection in People With Coronavirus Disease 2019 Undergoing Microbiological Investigation at the Time of Hospital Admission

    Admission procalcitonin measurements and microbiology results were available for 1040 hospitalized adults with coronavirus disease 2019 (from 48 902 included in the International Severe Acute Respiratory and Emerging Infections Consortium World Health Organization Clinical Characterisation Protocol UK study). Although procalcitonin was higher in bacterial coinfection, this was neither clinically significant (median [IQR], 0.33 [0.11–1.70] ng/mL vs 0.24 [0.10–0.90] ng/mL) nor diagnostically useful (area under the receiver operating characteristic curve, 0.56 [95% confidence interval, .51–.60]).
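
    As context for the reported discriminative performance, the following is a minimal sketch (not from the study) of how the area under the receiver operating characteristic curve is computed for a candidate biomarker with scikit-learn; the procalcitonin values and coinfection labels below are invented for illustration, and an AUC close to 0.5 indicates the marker barely separates patients with and without bacterial coinfection.

```python
from sklearn.metrics import roc_auc_score

# Hypothetical admission procalcitonin values (ng/mL) and bacterial-coinfection
# labels (1 = microbiologically confirmed coinfection, 0 = none).
procalcitonin = [0.05, 0.11, 0.24, 0.33, 0.90, 1.70, 0.10, 0.20, 0.45, 2.10]
coinfection   = [0,    0,    0,    1,    0,    1,    0,    1,    0,    1]

# AUC = probability that a randomly chosen coinfected patient has a higher
# procalcitonin value than a randomly chosen non-coinfected patient.
auc = roc_auc_score(coinfection, procalcitonin)
print(f"AUC = {auc:.2f}")  # values near 0.5 imply little diagnostic value
```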

    Implementation of corticosteroids in treating COVID-19 in the ISARIC WHO Clinical Characterisation Protocol UK: prospective observational cohort study

    BACKGROUND: Dexamethasone was the first intervention proven to reduce mortality in patients with COVID-19 being treated in hospital. We aimed to evaluate the adoption of corticosteroids in the treatment of COVID-19 in the UK after the RECOVERY trial publication on June 16, 2020, and to identify discrepancies in care. METHODS: We did an audit of clinical implementation of corticosteroids in a prospective, observational cohort study in 237 UK acute care hospitals between March 16, 2020, and April 14, 2021, restricted to patients aged 18 years or older with proven or high likelihood of COVID-19 who received supplementary oxygen. The primary outcome was administration of dexamethasone, prednisolone, hydrocortisone, or methylprednisolone. This study is registered with ISRCTN, ISRCTN66726260. FINDINGS: Between June 17, 2020, and April 14, 2021, 47 795 (75·2%) of 63 525 patients on supplementary oxygen received corticosteroids, with rates higher among patients requiring critical care than among those receiving ward care (11 185 [86·6%] of 12 909 vs 36 415 [72·4%] of 50 278). Patients 50 years or older were significantly less likely to receive corticosteroids than those younger than 50 years (adjusted odds ratio 0·79 [95% CI 0·70–0·89], p=0·0001, for 70–79 years; 0·52 [0·46–0·58], p<0·0001, for >80 years), independent of patient demographics and illness severity. 84 (54·2%) of 155 pregnant women received corticosteroids. Rates of corticosteroid administration increased from 27·5% in the week before June 16, 2020, to 75–80% in January, 2021. INTERPRETATION: Implementation of corticosteroids into clinical practice in the UK for patients with COVID-19 has been successful, but not universal. Patients older than 70 years, independent of illness severity, chronic neurological disease, and dementia, were less likely to receive corticosteroids than those who were younger, as were pregnant women. This could reflect appropriate clinical decision making, but the possibility of inequitable access to life-saving care should be considered. FUNDING: UK National Institute for Health Research and UK Medical Research Council
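
    The adjusted odds ratios quoted above are the kind of estimate produced by multivariable logistic regression. The sketch below (not the study's analysis code) shows how an adjusted odds ratio and its confidence interval are obtained with statsmodels; the variable names, simulated data, and effect sizes are entirely hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical patient-level data: an age-group indicator, a crude severity
# flag, and corticosteroid receipt simulated with lower odds in the older group.
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "age_70_79": rng.integers(0, 2, n),   # 1 if aged 70-79 years, else 0
    "severe": rng.integers(0, 2, n),      # 1 if severe illness at admission
})
linpred = 1.0 - 0.25 * df["age_70_79"] + 0.8 * df["severe"]
df["steroids"] = rng.binomial(1, (1 / (1 + np.exp(-linpred))).to_numpy())

# Logistic regression adjusting for severity; exponentiated coefficients are
# adjusted odds ratios analogous to those reported in the abstract.
model = smf.logit("steroids ~ age_70_79 + severe", data=df).fit(disp=False)
print(np.exp(model.params))      # adjusted odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals
```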

    Guidelines for the use and interpretation of assays for monitoring autophagy (3rd edition)

    In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to delete or knock down more than one autophagy-related gene. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways, so not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.
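
    To make the distinction between autophagosome number and autophagic flux concrete, here is a tiny numerical sketch (not a protocol from the guidelines) of the turnover idea: flux is inferred from how much more of an autophagosome marker such as LC3-II accumulates when lysosomal degradation is blocked, rather than from the steady-state marker level alone. The densitometry values are invented for illustration.

```python
def marker_flux(without_inhibitor: float, with_inhibitor: float) -> float:
    """Approximate autophagic flux as the extra marker signal (e.g., LC3-II)
    that accumulates when lysosomal degradation is blocked."""
    return with_inhibitor - without_inhibitor

# Scenario A: genuine induction -- more autophagosomes AND more degradation,
# so blocking the lysosome reveals a large difference.
print("induced, flux =", marker_flux(without_inhibitor=2.0, with_inhibitor=6.0))  # 4.0

# Scenario B: late-stage block -- autophagosomes pile up (high baseline signal)
# but little is being degraded, so the inhibitor changes almost nothing.
print("blocked, flux =", marker_flux(without_inhibitor=5.0, with_inhibitor=5.5))  # 0.5
```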

    Non-steroidal anti-inflammatory drug use and outcomes of COVID-19 in the ISARIC Clinical Characterisation Protocol UK cohort: a matched, prospective cohort study.

    Background: Early in the pandemic it was suggested that pre-existing use of non-steroidal anti-inflammatory drugs (NSAIDs) could lead to increased disease severity in patients with COVID-19. NSAIDs are important analgesics, particularly for those with rheumatological disease, and are widely available to the general public without prescription. Evidence from community studies, administrative data, and small studies of hospitalised patients suggests NSAIDs are not associated with poorer COVID-19 outcomes. We aimed to characterise the safety of NSAIDs and identify whether pre-existing NSAID use was associated with increased severity of COVID-19 disease. Methods: This prospective, multicentre cohort study included patients of any age admitted to hospital with a confirmed or highly suspected SARS-CoV-2 infection leading to COVID-19 between Jan 17 and Aug 10, 2020. The primary outcome was in-hospital mortality, and secondary outcomes were disease severity at presentation, admission to critical care, receipt of invasive ventilation, receipt of non-invasive ventilation, use of supplementary oxygen, and acute kidney injury. NSAID use was required to be within the 2 weeks before hospital admission. We used logistic regression to estimate the effects of NSAIDs and adjust for confounding variables. We used propensity score matching to further estimate the effects of NSAIDs while accounting for covariate differences between populations. Results: Between Jan 17 and Aug 10, 2020, we enrolled 78 674 patients across 255 health-care facilities in England, Scotland, and Wales. 72 179 patients had death outcomes available for matching; 40 406 (56·2%) of 71 915 were men and 31 509 (43·8%) were women. In this cohort, 4211 (5·8%) patients were recorded as taking systemic NSAIDs before admission to hospital. Following propensity score matching, balanced groups of NSAID users and NSAID non-users were obtained (4205 patients in each group). At hospital admission, we observed no significant differences in severity between exposure groups. After adjusting for explanatory variables, NSAID use was not associated with worse in-hospital mortality (matched OR 0·95, 95% CI 0·84–1·07; p=0·35), critical care admission (1·01, 0·87–1·17; p=0·89), requirement for invasive ventilation (0·96, 0·80–1·17; p=0·69), requirement for non-invasive ventilation (1·12, 0·96–1·32; p=0·14), requirement for oxygen (1·00, 0·89–1·12; p=0·97), or occurrence of acute kidney injury (1·08, 0·92–1·26; p=0·33). Interpretation: NSAID use is not associated with higher mortality or increased severity of COVID-19. Policy makers should consider reviewing issued advice around NSAID prescribing and COVID-19 severity. Funding: National Institute for Health Research and Medical Research Council
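
    The propensity score matching described in the Methods can be illustrated with a minimal sketch (not the study's code): estimate each patient's probability of NSAID exposure from covariates with logistic regression, pair exposed patients with unexposed patients who have similar scores, and then compare outcomes in the matched groups. All variable names, simulated data, and effect sizes below are hypothetical, and balance diagnostics, calipers, and matching without replacement are omitted for brevity.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Hypothetical cohort: covariates, NSAID exposure, and an in-hospital death flag.
rng = np.random.default_rng(42)
n = 20_000
df = pd.DataFrame({
    "age": rng.normal(70, 15, n),
    "male": rng.integers(0, 2, n),
    "comorbidity": rng.poisson(1.5, n),
})
# Exposure depends on covariates (confounding); outcome depends on covariates only.
p_nsaid = 1 / (1 + np.exp(-(-3 + 0.01 * df["age"] + 0.3 * df["comorbidity"])))
df["nsaid"] = rng.binomial(1, p_nsaid.to_numpy())
p_death = 1 / (1 + np.exp(-(-6 + 0.05 * df["age"] + 0.4 * df["comorbidity"])))
df["died"] = rng.binomial(1, p_death.to_numpy())

# 1. Propensity score: modelled probability of NSAID exposure given covariates.
X = df[["age", "male", "comorbidity"]]
df["ps"] = LogisticRegression(max_iter=1000).fit(X, df["nsaid"]).predict_proba(X)[:, 1]

# 2. 1:1 nearest-neighbour matching (with replacement) on the propensity score.
exposed, unexposed = df[df["nsaid"] == 1], df[df["nsaid"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(unexposed[["ps"]])
_, idx = nn.kneighbors(exposed[["ps"]])
matched_controls = unexposed.iloc[idx.ravel()]

# 3. Compare outcomes in the matched groups (a simple risk comparison here).
print("mortality, NSAID users      :", round(exposed["died"].mean(), 3))
print("mortality, matched non-users:", round(matched_controls["died"].mean(), 3))
```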

    Co-infections, secondary infections, and antimicrobial use in patients hospitalised with COVID-19 during the first pandemic wave from the ISARIC WHO CCP-UK study: a multicentre, prospective cohort study

    Background: Microbiological characterisation of co-infections and secondary infections in patients with COVID-19 is lacking, and antimicrobial use is high. We aimed to describe microbiologically confirmed co-infections and secondary infections, and antimicrobial use, in patients admitted to hospital with COVID-19. Methods: The International Severe Acute Respiratory and Emerging Infections Consortium (ISARIC) WHO Clinical Characterisation Protocol UK (CCP-UK) study is an ongoing, prospective cohort study recruiting inpatients from 260 hospitals in England, Scotland, and Wales, conducted by the ISARIC Coronavirus Clinical Characterisation Consortium. Patients with a confirmed or clinician-defined high likelihood of SARS-CoV-2 infection were eligible for inclusion in the ISARIC WHO CCP-UK study. For this specific study, we excluded patients with a recorded negative SARS-CoV-2 test result and those without a recorded outcome at 28 days after admission. Demographic, clinical, laboratory, therapeutic, and outcome data were collected using a prespecified case report form. Organisms considered clinically insignificant were excluded. Findings: We analysed data from 48 902 patients admitted to hospital between Feb 6 and June 8, 2020. The median patient age was 74 years (IQR 59–84) and 20 786 (42·6%) of 48 765 patients were female. Microbiological investigations were recorded for 8649 (17·7%) of 48 902 patients, with clinically significant COVID-19-related respiratory or bloodstream culture results recorded for 1107 patients. 762 (70·6%) of 1080 infections were secondary, occurring more than 2 days after hospital admission. Staphylococcus aureus and Haemophilus influenzae were the most common pathogens causing respiratory co-infections (diagnosed ≤2 days after admission), with Enterobacteriaceae and S aureus most common in secondary respiratory infections. Bloodstream infections were most frequently caused by Escherichia coli and S aureus. Among patients with available data, 13 390 (37·0%) of 36 145 had received antimicrobials in the community for this illness episode before hospital admission and 39 258 (85·2%) of 46 061 patients with inpatient antimicrobial data received one or more antimicrobials at some point during their admission (highest for patients in critical care). We identified frequent use of broad-spectrum agents and use of carbapenems rather than carbapenem-sparing alternatives. Interpretation: In patients admitted to hospital with COVID-19, microbiologically confirmed bacterial infections are rare, and more likely to be secondary infections. Gram-negative organisms and S aureus are the predominant pathogens. The frequency and nature of antimicrobial use are concerning, but tractable targets for stewardship interventions exist. Funding: National Institute for Health Research (NIHR), UK Medical Research Council, Wellcome Trust, UK Department for International Development, Bill & Melinda Gates Foundation, EU Platform for European Preparedness Against (Re-)emerging Epidemics, NIHR Health Protection Research Unit (HPRU) in Emerging and Zoonotic Infections at University of Liverpool, and NIHR HPRU in Respiratory Infections at Imperial College London

    Proceedings of the 3rd Biennial Conference of the Society for Implementation Research Collaboration (SIRC) 2015: advancing efficient methodologies through community partnerships and team science

    It is well documented that the majority of adults, children and families in need of evidence-based behavioral health interventions do not receive them [1, 2] and that few robust empirically supported methods for implementing evidence-based practices (EBPs) exist. The Society for Implementation Research Collaboration (SIRC) represents a burgeoning effort to advance the innovation and rigor of implementation research and is uniquely focused on bringing together researchers and stakeholders committed to evaluating the implementation of complex evidence-based behavioral health interventions. Through its diverse activities and membership, SIRC aims to foster the promise of implementation research to better serve the behavioral health needs of the population by identifying rigorous, relevant, and efficient strategies that successfully transfer scientific evidence to clinical knowledge for use in real world settings [3]. SIRC began as a National Institute of Mental Health (NIMH)-funded conference series in 2010 (previously titled the “Seattle Implementation Research Conference”; $150,000 USD for 3 conferences in 2011, 2013, and 2015) with the recognition that there were multiple researchers and stakeholders working in parallel on innovative implementation science projects in behavioral health, but that formal channels for communicating and collaborating with one another were relatively unavailable. There was a significant need for a forum within which implementation researchers and stakeholders could learn from one another, refine approaches to science and practice, and develop an implementation research agenda using common measures, methods, and research principles to improve both the frequency and quality with which behavioral health treatment implementation is evaluated. SIRC’s membership growth is a testament to this identified need, with more than 1000 members from 2011 to the present. SIRC’s primary objectives are to: (1) foster communication and collaboration across diverse groups, including implementation researchers, intermediaries, as well as community stakeholders (SIRC uses the term “EBP champions” for these groups) – and to do so across multiple career levels (e.g., students, early career faculty, established investigators); and (2) enhance and disseminate rigorous measures and methodologies for implementing EBPs and evaluating EBP implementation efforts. These objectives are well aligned with Glasgow and colleagues’ [4] five core tenets deemed critical for advancing implementation science: collaboration, efficiency and speed, rigor and relevance, improved capacity, and cumulative knowledge. SIRC advances these objectives and tenets through in-person conferences, which bring together multidisciplinary implementation researchers and those implementing evidence-based behavioral health interventions in the community to share their work and create professional connections and collaborations.